Rosenbrock methods

Rosenbrock methods may refer to either of two distinct ideas in numerical computation, both named for Howard H. Rosenbrock. Rosenbrock optimization methods are a family of numerical optimization algorithms for problems in which the objective function is inexpensive to compute but its derivatives either do not exist or cannot be computed efficiently.[1] Rosenbrock methods for stiff differential equations are methods for solving ordinary differential equations whose solutions contain a wide range of characteristic timescales.[2]
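The defining feature of a Rosenbrock ODE method is that it is linearly implicit: each stage requires only one linear solve with the Jacobian, rather than the Newton iteration a fully implicit method needs. The following is a minimal one-stage sketch (sometimes called the linearly implicit Euler method); the choice gamma = 1 and the test problem are illustrative assumptions, not taken from Rosenbrock's paper.

```python
import numpy as np

def rosenbrock_euler_step(f, jac, y, h, gamma=1.0):
    """One linearly implicit (Rosenbrock-type) step for y' = f(y).

    Solves (I - h*gamma*J) k = f(y) once, then sets y_new = y + h*k.
    A single linear solve replaces the Newton iteration that a fully
    implicit method would require at each step.
    """
    n = y.size
    J = jac(y)                       # Jacobian of f at the current point
    A = np.eye(n) - h * gamma * J
    k = np.linalg.solve(A, f(y))
    return y + h * k

# Stiff test problem y' = -1000*y: explicit Euler would need h < 0.002
# to remain stable, but the linearly implicit step is stable at h = 0.1.
f = lambda y: -1000.0 * y
jac = lambda y: np.array([[-1000.0]])

y = np.array([1.0])
for _ in range(10):
    y = rosenbrock_euler_step(f, jac, y, h=0.1)
print(y[0])  # decays monotonically toward 0, no oscillation
```

Practical Rosenbrock solvers use several stages with tuned coefficients for higher order and stability; the one-stage version above only shows the structure of a single step.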

Rosenbrock optimization methods are direct-search techniques related to the Nelder–Mead simplex method, but with better convergence properties.
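Rosenbrock's 1960 search probes along a set of orthogonal directions, growing a direction's step after a successful probe and shrinking and reversing it after a failure. The sketch below shows that core loop; the step factors (3 and −0.5), iteration budget, and starting point are illustrative assumptions, and the full method additionally re-orients the direction set by Gram–Schmidt after each round of successes and failures, which is omitted here for brevity.

```python
import numpy as np

def rosenbrock_search(f, x0, step=0.1, alpha=3.0, beta=-0.5, iters=500):
    """Simplified sketch of Rosenbrock's derivative-free search.

    Probes along each direction in turn: a successful probe is accepted
    and the step grows (alpha); a failed probe shrinks and reverses the
    step (beta).  No derivatives of f are ever evaluated.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    d = np.eye(n)                 # search directions (fixed in this sketch)
    steps = np.full(n, step)
    fx = f(x)
    for _ in range(iters):
        for i in range(n):
            trial = x + steps[i] * d[i]
            ft = f(trial)
            if ft < fx:           # success: accept and expand
                x, fx = trial, ft
                steps[i] *= alpha
            else:                 # failure: shrink and reverse
                steps[i] *= beta
    return x, fx

# Minimize the (aptly named) Rosenbrock banana function from a
# standard hard starting point.
banana = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x, fx = rosenbrock_search(banana, [-1.2, 1.0])
print(x, fx)
```

Because the sketch keeps the axes fixed, it crawls slowly along the banana function's curved valley; the axis rotation in the full method is precisely what lets it track such valleys efficiently.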

References

  1. ^ H. H. Rosenbrock, "An Automatic Method for Finding the Greatest or Least Value of a Function", The Computer Journal (1960) 3(3): 175–184
  2. ^ H. H. Rosenbrock, "Some general implicit processes for the numerical solution of differential equations", The Computer Journal (1963) 5(4): 329–330